SP22: Machine Learning for Signal Processing - Assignment 4

Name : Akhila Sakiramolla (asakiram@iu.edu)

UID : 2000886005

Importing required libraries

P1: kNN Source Separation

From the denoised audio above, we observe that the noise is suppressed and the recovered speech is intelligible: "She had your dark suit in greasy wash water all year".
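A minimal sketch of the kNN mask-based separation used here, under the standard setup for this kind of assignment: for each noisy test spectrogram frame, find its k nearest noisy training frames, average their ideal binary masks, and threshold. All names and shapes below are illustrative assumptions, not the exact assignment code.

```python
import numpy as np

def knn_separation_masks(X_test, X_train, B_train, k=5):
    """Estimate binary masks for noisy test frames via kNN.

    X_test  : (F, T_test)  magnitude spectrogram of the noisy test signal
    X_train : (F, T_train) magnitude spectrogram of noisy training frames
    B_train : (F, T_train) ideal binary masks for the training frames
    Returns an (F, T_test) binary mask for the test spectrogram.
    """
    # Squared Euclidean distance between every test and training frame
    d2 = (np.sum(X_test**2, axis=0)[:, None]
          + np.sum(X_train**2, axis=0)[None, :]
          - 2.0 * X_test.T @ X_train)                 # (T_test, T_train)
    # Indices of the k nearest training frames per test frame
    nn = np.argsort(d2, axis=1)[:, :k]                # (T_test, k)
    # Average the neighbors' masks, then threshold at 0.5
    return (B_train[:, nn].mean(axis=2) > 0.5).astype(float)  # (F, T_test)
```

The estimated mask is then applied to the complex STFT of the noisy test signal and inverted back to the time domain to obtain the denoised audio.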

P2: Motor Imagery

I evaluated the kNN model for values of k ranging from 3 to 17 and values of L ranging from 700 to 730. The best accuracy, 78.6%, was observed for k = 5 with L between 716 and 718.
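The sweep described above can be sketched as a grid search over k and L; here L is treated as the number of leading features kept, which is an assumption, and the function names are illustrative.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k):
    """Plain kNN classifier: Euclidean distance, majority vote."""
    d2 = (np.sum(X_test**2, axis=1)[:, None]
          + np.sum(X_train**2, axis=1)[None, :]
          - 2.0 * X_test @ X_train.T)       # (n_test, n_train) squared distances
    nn = np.argsort(d2, axis=1)[:, :k]      # k nearest training indices per test point
    return np.array([Counter(y_train[i]).most_common(1)[0][0] for i in nn])

def sweep(X_train, y_train, X_test, y_test, ks, Ls):
    """Grid-search test accuracy over neighbor count k and feature count L."""
    acc = {}
    for L in Ls:
        for k in ks:
            pred = knn_predict(X_train[:, :L], y_train, X_test[:, :L], k)
            acc[(k, L)] = float(np.mean(pred == y_test))
    return acc
```

Plotting `acc` as a heatmap over (k, L) gives the accuracy surface from which the optimum is read off.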

Compared to the accuracy plot obtained with the Naive Bayes classifier, we observe a drop in performance when classifying with kNN, although the gap between the best accuracies is small. The drop in accuracy may be caused by the following:

  1. kNN is a lazy learner: it must compute a similarity measure against every training point at prediction time, which makes it slow and difficult to use on large datasets. Naive Bayes, in contrast, is an eager learner and is much faster than kNN.
  2. kNN has only one smoothing hyperparameter, k, the number of neighbors, and choosing a good k is a challenge: a small k is highly sensitive to noise, while a large k over-smooths the decision boundary by pulling in neighbors from other classes. Naive Bayes has two smoothing hyperparameters, alpha and beta, which allow finer control.
  3. kNN is a non-parametric model whose effective complexity grows with the number of training samples, whereas Naive Bayes is a parametric model with a fixed number of parameters.

P3: Multidimensional Scaling

The scatter plot of the MDS-embedded points reveals the IU logo.
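The embedding above can be reproduced with classical (Torgerson) MDS: double-center the squared pairwise distances and take the top eigenvectors of the resulting Gram matrix. A minimal sketch, not the assignment's exact code:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical MDS from a pairwise Euclidean distance matrix D.

    D   : (n, n) symmetric distance matrix
    dim : target embedding dimension
    Returns an (n, dim) coordinate array (unique up to rotation/reflection).
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D**2) @ J             # double-centered Gram matrix
    w, V = np.linalg.eigh(B)              # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:dim]       # keep the top `dim` components
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))
```

Scattering the two returned columns against each other produces the logo, possibly rotated or mirrored, since MDS recovers coordinates only up to a rigid transformation.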